Efficient multiple incremental computation for Kernel Ridge Regression with Bayesian uncertainty modeling

Authors

  • Bo-Wei Chen
  • Nik Nailah Binti Abdullah
  • Sang Oh Park
  • Y. Gu
Abstract

This study presents an energy-economic approach for incremental/decremental learning based on kernel ridge regression, a regressor frequently used on clouds. To avoid reanalyzing the entire dataset whenever the data change, the proposed mechanism supports incremental/decremental processing of both single and multiple samples (i.e., batch processing). Moreover, incremental/decremental analyses in empirical and intrinsic space are introduced to handle data matrices with large numbers of samples or feature dimensions. Finally, the proposed mechanism is extended to statistical Kernelized Bayesian Regression, so that incremental/decremental analyses become applicable there as well. Experimental results showed that the accuracy of the proposed method remained on par with the original nonincremental design, while training time and power consumption were significantly reduced. These findings demonstrate the effectiveness of the proposed method.
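The abstract's core idea, avoiding a full re-fit of kernel ridge regression when a sample arrives, can be illustrated with a standard block-inverse (Schur complement) update of the regularized kernel matrix. The sketch below is not the paper's algorithm; it is a minimal single-sample incremental KRR in numpy under assumed choices (an RBF kernel, hypothetical class and parameter names `IncrementalKRR`, `lam`, `gamma`) to show why an update can replace a re-inversion.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between row-sample arrays A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

class IncrementalKRR:
    """Minimal sketch of single-sample incremental kernel ridge regression.

    Maintains A_inv = (K + lam*I)^{-1} and grows it with a block-inverse
    (Schur complement) step when one sample arrives, avoiding a full
    O(n^3) re-inversion of the kernel matrix.
    """

    def __init__(self, X, y, lam=1e-2, gamma=0.5):
        self.lam, self.gamma = lam, gamma
        self.X, self.y = np.atleast_2d(X), np.asarray(y, float)
        K = rbf_kernel(self.X, self.X, gamma)
        self.A_inv = np.linalg.inv(K + lam * np.eye(len(self.y)))

    def add_sample(self, x_new, y_new):
        x_new = np.atleast_2d(x_new)
        b = rbf_kernel(self.X, x_new, self.gamma)            # (n, 1) cross-kernel
        d = rbf_kernel(x_new, x_new, self.gamma) + self.lam  # (1, 1) diagonal term
        Ab = self.A_inv @ b
        s = float(d - b.T @ Ab)                              # Schur complement
        top_left = self.A_inv + (Ab @ Ab.T) / s
        self.A_inv = np.block([[top_left, -Ab / s],
                               [-Ab.T / s, np.array([[1.0 / s]])]])
        self.X = np.vstack([self.X, x_new])
        self.y = np.append(self.y, y_new)

    def predict(self, X_test):
        alpha = self.A_inv @ self.y                          # dual coefficients
        return rbf_kernel(np.atleast_2d(X_test), self.X, self.gamma) @ alpha
```

A decremental step would invert this update (removing a row/column of the inverse via the same Schur identity), and a multiple-sample batch version would replace the scalar `s` with a small matrix inverted by the Woodbury identity.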


Related articles

Regression from patch kernel

In this paper, we present a patch-based regression framework for addressing the human age and head pose estimation problems. Finally, kernel regression is employed for ultimate human age or head pose estimation. Firstly, each image is ... via Kernel Partial Least Squares Regressi...

Full text

The Stream Algorithm: Computationally Efficient Ridge-Regression via Bayesian Model Averaging, and Applications to Pharmacogenomic Prediction of Cancer Cell Line Sensitivity

Computational efficiency is important for learning algorithms operating in the "large p, small n" setting. In computational biology, the analysis of data sets containing tens of thousands of features ("large p"), but only a few hundred samples ("small n"), is nowadays routine, and regularized regression approaches such as ridge-regression, lasso, and elastic-net are popular choices. In this pap...

Full text

Subspace Information Criterion for Infinite Dimensional Hypothesis Spaces

A central problem in learning is to select an appropriate model. This is typically done by estimating the unknown generalization errors of a set of models to be selected from and by then choosing the model with minimal generalization error estimate. In this article, we discuss the problem of model selection and generalization error estimation in the context of kernel regression models, e.g., ke...

Full text

Bayesian Generalized Kernel Models

We propose a fully Bayesian approach for generalized kernel models (GKMs), which are extensions of generalized linear models in the feature space induced by a reproducing kernel. We place a mixture of a point-mass distribution and Silverman’s g-prior on the regression vector of GKMs. This mixture prior allows a fraction of the regression vector to be zero. Thus, it serves for sparse modeling an...

Full text

Recursion-Free Online Multiple Incremental/Decremental Analysis Based on Ridge Support Vector Learning

Abstract—This study presents a rapid multiple incremental and decremental mechanism based on Weight-Error Curves (WECs) for support-vector analysis. To handle rapidly increasing amounts of data, recursion-free computation is proposed for predicting the Lagrangian multipliers of new samples. This study examines the characteristics of Ridge Support Vector Models, including Ridge Support Vector ...

Full text


Journal:
  • Future Generation Comp. Syst.

Volume 82, Issue 

Pages -

Publication date: 2018